Release Notes for Q2 2024

Explore the new features and enhancements added to this update!

Updated in: July 2024

Release Version: 1.18

Features and Enhancements
Support for Salesforce integration with Amazon AppFlow to ingest data into a Snowflake table

With this release, we now support ingesting Salesforce data into a Snowflake table by using Amazon AppFlow. You can add tags, as key-value pairs, to the AppFlow objects stored in AWS. Tags help you manage, organize, search, and filter resources.

See Data Ingestion from Salesforce to Snowflake using Amazon AppFlow.
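As a minimal sketch of how key-value tags make resources filterable, the following assumes each AppFlow object is represented as a record with a flat `tags` dictionary; the record shape and field names are illustrative, not the platform's actual API.

```python
# Hypothetical sketch: filtering AppFlow flow records by tag key-value pairs.
# The record structure (a dict with a "tags" dict) is an assumption for illustration.

def filter_by_tags(resources, required_tags):
    """Return resources whose tags contain every required key-value pair."""
    return [
        r for r in resources
        if all(r.get("tags", {}).get(k) == v for k, v in required_tags.items())
    ]

flows = [
    {"name": "salesforce-to-snowflake", "tags": {"env": "prod", "team": "data"}},
    {"name": "test-flow", "tags": {"env": "dev"}},
]

prod_flows = filter_by_tags(flows, {"env": "prod"})
```

Matching on every required pair (rather than any) mirrors the common tag-search behavior of AWS consoles, where adding a tag narrows the result set.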
Support for creating branch templates for custom code

With this release, we now support creating branch templates in the source code repository for custom code for transformation jobs created in Databricks and Snowflake. You can promote and deploy a data pipeline to the required branch, depending on the defined branching structure.

See Databricks Custom Transformation Job.

See Snowflake Custom Transformation Job.
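To illustrate promoting a pipeline to the branch defined by a branching structure, here is a hypothetical sketch; the stage names, branch patterns, and configuration format are assumptions for illustration only, not the platform's actual template syntax.

```python
# Hypothetical sketch: resolving the target repository branch when promoting
# a data pipeline, based on a defined branching structure. Stage names and
# branch patterns below are illustrative assumptions.

BRANCH_TEMPLATE = {
    "develop": "feature/{pipeline}",
    "qa": "release/{pipeline}",
    "prod": "main",
}

def target_branch(stage, pipeline):
    """Return the branch a pipeline is promoted to for a deployment stage."""
    return BRANCH_TEMPLATE[stage].format(pipeline=pipeline)
```

For example, promoting a pipeline named `orders-etl` to the `qa` stage would resolve to the `release/orders-etl` branch under this structure.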

Filtering and preview of data are now enabled in data crawlers and data catalogs

With this release, you can now apply specific conditions to table columns during data crawling to filter data according to your needs. Afterward, you can create a catalog with the filtered data and preview it in the Data Crawler Preview. You can apply additional filters on the catalog that is created and view it in the Data Catalog Preview. You can use the customized catalogs in data pipelines.

See Data Crawler and Data Catalog.
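The column-level filtering described above can be sketched as follows; the row structure, column names, and operator syntax are illustrative assumptions, not the platform's actual filter configuration.

```python
# Hypothetical sketch: applying column conditions to crawled table rows,
# producing the filtered set that a catalog preview would show.

import operator

OPS = {">": operator.gt, "<": operator.lt, "=": operator.eq, "!=": operator.ne}

def apply_filters(rows, conditions):
    """Keep only rows satisfying every (column, op, value) condition."""
    return [
        row for row in rows
        if all(OPS[op](row[col], val) for col, op, val in conditions)
    ]

rows = [
    {"id": 1, "region": "EU", "amount": 120},
    {"id": 2, "region": "US", "amount": 80},
    {"id": 3, "region": "EU", "amount": 60},
]

# Filter during crawling, then filter the resulting catalog further,
# mirroring the two-step crawler -> catalog flow described above.
crawled = apply_filters(rows, [("region", "=", "EU")])
catalog_preview = apply_filters(crawled, [("amount", ">", 100)])
```

Conditions combine with AND semantics here; a row must satisfy every condition to appear in the preview.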

Support for triggering a pipeline by listening to Amazon SQS event notifications

Data Pipeline Studio now supports triggering a pipeline by listening to Amazon SQS event notifications. The pipeline consumes SQS events from AWS and, based on the notifications, triggers or terminates a pipeline run.

See Trigger a pipeline based on SQS notifications.
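A listener of this kind might dispatch each notification to a pipeline action roughly as sketched below; the message shape (`action`, `pipeline_id`) and the trigger/terminate callbacks are assumptions for illustration, not the platform's actual message contract.

```python
# Hypothetical sketch: mapping an SQS event notification to a pipeline action.
# The JSON fields and callback names are illustrative assumptions.

import json

def handle_sqs_message(body, trigger_run, terminate_run):
    """Dispatch one SQS message body to the matching pipeline action."""
    event = json.loads(body)
    pipeline_id = event["pipeline_id"]
    if event.get("action") == "start":
        return trigger_run(pipeline_id)
    if event.get("action") == "stop":
        return terminate_run(pipeline_id)
    return None  # ignore unrelated notifications

actions = []
handle_sqs_message(
    '{"action": "start", "pipeline_id": "p-42"}',
    trigger_run=lambda pid: actions.append(("trigger", pid)),
    terminate_run=lambda pid: actions.append(("terminate", pid)),
)
```

In a real consumer, the message body would come from an SQS `ReceiveMessage` call and the message would be deleted from the queue only after the action succeeds, so failed dispatches are retried.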

Renaming of steps in Snowflake and Databricks data integration, data transformation, and data quality jobs

The following steps in data integration, data transformation, and data quality jobs have been renamed: 

Job                                      Existing Name        New Name
Data Integration (templatized job)       Column Mapping       Schema Mapping
Data Integration (templatized job)       Data Mapping         Data Management
Data Integration (custom job)            Custom Parameters    Additional Parameters
Data Transformation (templatized job)    Column Mapping       Schema Mapping
Data Transformation (custom job)         Custom Parameters    Additional Parameters
Data Quality                             Issue Resolver       Data Issue Resolver
Snowflake Partner ID implementation

With this release, Snowflake Partner Program telemetry is enabled in the Lazsa Platform. The toggle is set to ON by default; you can disable it manually.

See Configuring Snowflake Connection Details.

Databricks Wheel Package upgrade available

The Lazsa Platform periodically upgrades the Databricks wheel package. When an upgrade is available, an indication appears on the Data Pipeline Studio UI.

In this release, the wheel package is upgraded to the following version:

  • Databricks Wheel Package - 1.0.31

See Databricks Wheel Package Management.

Support for the latest versions of technologies

Support for the following latest versions of technologies is now available in the Lazsa Platform:

  • Django 5.0.4

  • Java 21 Spring Boot 3.2 - Maven

  • Java 21 Spring Boot 3.1 - Gradle

  • Java 21 Spring Boot 3.1 - Maven

  • Java 17 with GraphQL Spring Boot 3.2 - Gradle

  • Java 17 with GraphQL Spring Boot 3.2 - Maven

  • Java 17 with GraphQL Spring Boot 3.1 - Gradle

  • Java 17 with GraphQL Spring Boot 3.1 - Maven

To view all the supported tools and technologies, see Tools and Technologies Integrated with Lazsa Platform.

User Provisioning support for GitLab

You can provision users into GitLab repositories from within the Lazsa Platform. Simply create a Product role with the required GitLab project permissions and assign it to the intended teams or individual users who are part of your product team in Lazsa.
New version of Lazsa Orchestrator Agent available

A new version (1.1.84) of the Lazsa Orchestrator Agent with simplified installation steps is now available. Manual steps, such as creating a namespace in an EKS or AKS cluster, attaching the server certificate to the Ingress object, and creating a Docker registry secret, are now included in the installation command. Upgrade to the latest version for a seamless experience. For upgrade steps, see Updating Lazsa Orchestrator Agent.